Mississippi Deploys IBM-Based Homeland Security Infrastructure
IBM has announced the successful deployment of the
Mississippi Automated System Project (ASP), a mobile data infrastructure
designed to give local law enforcement and public safety agencies access to
critical information stored within a single database. When complete, ASP will
provide mobile units with real-time access to information including mug shots, arrest
warrants, criminal intelligence, hazardous materials data, and medical
emergency protocols. Through the efforts of U.S. Senators Thad Cochran (R-MS)
and Trent Lott (R-MS), the University of Southern Mississippi secured $14
million in federal grants for a pilot project supporting all law enforcement,
fire, and emergency medical services within Hancock, Harrison, and
Jackson counties. Each datacenter network will utilize one IBM eServer iSeries
825 and two eServer xSeries 445 systems running
Tarantella Secure Global Desktop Enterprise Edition remote access software,
Novell’s SuSE Linux, and IBM DB2. Each datacenter
will be linked to an identical datacenter at a separate location to provide
redundancy and prevent single points of failure. The system in these counties
will be rolled out in three distinct phases, the last of which is expected to
complete in October 2004. As the ASP expands, networks will be linked together
to join multiple jurisdictions into a single centralized information source.
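The announcement does not detail how ASP's mirrored datacenters actually fail over, but the paired-site redundancy it describes can be sketched in a few lines; the site names and health-check logic below are purely illustrative assumptions, not ASP's implementation.

```python
# Hypothetical sketch of the paired-datacenter redundancy described above.
# Site names and the health-check logic are illustrative assumptions; the
# announcement does not describe ASP's actual failover mechanism.

PRIMARY = "datacenter-a"      # hypothetical site identifier
SECONDARY = "datacenter-b"    # identical mirror at a separate location

def is_healthy(site: str, status: dict) -> bool:
    """Stubbed health check: True if the site currently reports itself up."""
    return status.get(site, False)

def route_query(status: dict) -> str:
    """Route a mobile unit's query to the primary datacenter, falling back
    to the mirrored secondary so no single site is a point of failure."""
    if is_healthy(PRIMARY, status):
        return PRIMARY
    if is_healthy(SECONDARY, status):
        return SECONDARY
    raise RuntimeError("both datacenters unreachable")

# Primary down: the query fails over to the mirror.
print(route_query({"datacenter-a": False, "datacenter-b": True}))
```

The point of linking each datacenter to an identical twin is exactly this: a mobile unit's request can always be served as long as either site survives.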
Since September 11, 2001, how IT can improve public safety
and national security has been the subject of ongoing discussions at local,
state, and federal levels. Beginning with the legislation that created the
Homeland Security Department, some of these efforts' reach has far exceeded
their grasp. For example, the Total Information Awareness (TIA) program, led by
Admiral John Poindexter (of Iran-Contra fame), planned to utilize advanced data
mining technologies to root out evildoers by scrutinizing millions of
citizens’ private information. While some TIA efforts continue, though with a
much lower profile since Poindexter’s resignation last year, most qualify as R&D
experiments whose influence will not be felt for years or even decades, if
ever.
More recently, the commission investigating the 9/11 terrorist attacks discovered severe breakdowns in communications technologies and protocols that profoundly affected the ability of law enforcement and public safety workers to protect the public. These issues, which ranged from archaic report-filing procedures in FBI field offices to failures in New York City's emergency response network, suggest that while an overarching strategy for dealing with terrorism and security can be valuable, the real job of protecting people occurs from city to city and street to street. Given that, we see Mississippi's prototype ASP as a notable example of how IT can enhance public safety and national security. By leveraging IBM hardware and middleware, along with Novell's SuSE Linux and Tarantella's remote access software, to systematize mobile access to integrated databases, the state is putting critical real-time information, be it criminal records or hazardous materials data, into the hands of the people who need it most and can use it best. To paraphrase the late House Speaker Tip O'Neill, all safety is local. Mississippi, with considerable assistance from IBM and others, is using IT to help ensure the well-being of its citizens wherever they reside.
Demand Growth for On Demand Computing
Computerworld has released results from its online survey
of 765 IT professionals gauging their perceptions about on demand computing. The survey found that while 64% of respondents indicated
some degree of skepticism about on demand computing, citing concerns about
cost, vendor lock-in, and security, 56% felt that on demand computing would
have an impact on their operations in the next few years. Thirty-two percent of
respondents have implemented on demand computing or are seriously evaluating
it, and 30% have budgeted funds for on demand technologies or services in 2004
or 2005. The survey collected statistics on on demand
computing, utility computing, adaptive computing, and grid computing.
Twenty-nine percent of respondents indicated that on demand was the most
important of these paradigms, roughly three times the importance ascribed to
utility, adaptive, or grid computing individually, or to a combination of them
all. Of the various vendors (IBM, Sun, HP, Oracle, EMC, etc.) that have been
associated with these initiatives, IBM ranked highest, cited by roughly 40% of
respondents; no other vendor exceeded 16%.
From this latest survey we see that overall, IT professionals remain skeptical about on demand computing. On demand is one of the more complex IT industry evolutions and is painted against a backdrop in which reality has not always matched the expectations set by vendor hype. Nonetheless, this skepticism is met with a sense that on demand computing will have a substantial impact on IT operations in the future. Yet only about one-third of respondents are seriously exploring on demand or have pilot projects underway. Given these seemingly contrary positions, on demand computing appears to face more than a perceived complexity problem: its message has not been articulated clearly enough to resonate with a majority of the survey sample. We believe this is unfortunate, and ironic, since the simplicity that demand-driven computing promises for users and managers of IT solutions should be compelling to any organization with more than one computer system, and a boon to organizations of any size, especially mid-tier and smaller ones. Yet the messaging around demand-driven computing seems to describe a different computing universe than the one most mid-tier and smaller enterprises believe they have in-house. While IBM will undoubtedly be pleased that its moniker came out on top, the notion of on demand appears to remain beyond the grasp of a substantial portion of the marketplace: a missed opportunity no matter how one looks at it.
The U.S. House of Representatives has approved a defense spending
bill that would restrict the export of computers far more tightly than present
controls do. The House version of the defense spending bill must
be reconciled with the Senate version of the spending package, which does not
include the language that would tighten export restrictions. That
reconciliation is expected to happen in joint Senate-House conference committee
meetings later this month. Under the terms of the House version of the export
controls, nothing could be exported that was as powerful or more powerful than
a Pentium III at 650 MHz. Current export controls require that computer
companies obtain a license to sell any computer with 32 Intel Itanium
processors or greater
computing power.
This latest legislative attempt to limit the amount of
computing power shipped out of the country no doubt will pit defense hawks
against computer industry interests that would see the exportable computing
power fall to 1999 desktop levels. While such models may be in high demand in
schools, libraries, or other institutions seeking access to computers for
little or no cost to replace more antiquated machines, we suspect there will be
little interest in such technology from offshore private or public sectors. Perhaps
more importantly, we believe computer vendors would have even less interest in
selling such technology. Defense Department hawks argue that the
Commerce Department has not been properly vigilant in preventing exports of
technology that could aid in the development of weapons of mass destruction and
assist in developing enhanced encryption systems.
While those hawks hope that the House legislation will pass and help slow down the proliferation of WMDs, we believe what the Defense Department really needs is a time machine that would allow it to go back to 1999 and shut down all exports from that point forward. Such a move would have a devastating effect on U.S. computer vendors, and it would enlarge the technology gap between the U.S. and the rest of the world, at least temporarily. Of course, nature abhors a vacuum, so offshore computer makers would find themselves free to take over non-U.S. markets with no competition from the U.S. As we have seen in the past five years, those markets are booming. Without that time machine, any effort to limit the proliferation of valuable computing power in 2004 seems very much like closing the barn door after the cows have already left the farm, since those five years have seen the allowable export of machines much more powerful than a Pentium III. The backers of export controls also appear to have little grasp of the evolving nature of computing: even those Pentium IIIs could be used in clustered configurations to provide the computing power necessary to simulate a nuclear blast, for example. Hopefully, this clause in the defense spending bill will die in conference committee, where cooler, and more informed, heads will prevail.
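The clustering point above can be made with back-of-the-envelope arithmetic; the per-node throughput figure below is a rough assumption for illustration, not a measured benchmark.

```python
# Back-of-the-envelope arithmetic (all figures are rough assumptions):
# even nodes capped at Pentium III/650 MHz performance can be clustered
# into substantial aggregate computing power.

cap_mhz = 650              # per-node clock under the proposed export cap
flops_per_cycle = 2        # assumed rough peak for a P-III using SSE
node_gflops = cap_mhz * 1e6 * flops_per_cycle / 1e9   # ~1.3 GFLOPS per node

for nodes in (1, 64, 1024):
    print(f"{nodes:5d} nodes ~ {nodes * node_gflops:8.1f} GFLOPS aggregate")
```

Whatever the exact per-node number, the multiplication is the point: a cap on individual machines does little to cap what a determined buyer can assemble.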
The Elephant in the Room: Intel Launches New Workstation Processors/Chipsets
Intel has announced availability of a new Xeon
processor-based platform that utilizes technologies the company said will
significantly boost performance, memory, and graphics capabilities for
workstations. Intel indicated that the new Xeon chips would particularly
benefit applications in areas such as financial and scientific data
modeling, digital filmmaking, and design automation. The new Intel Xeon
processor (formerly codenamed "Nocona") integrates technologies
including DDR2 memory, PCI Express, a faster 800MHz system bus, Intel Extended
Memory 64 Technology, and enhanced Intel SpeedStep.
In addition, the processor takes advantage of improvements in Intel’s
Hyper-Threading and Streaming SIMD Extensions 3 (SSE3) instructions to improve system
performance and responsiveness. The new Intel E7525 chipset (formerly codenamed
"Tumwater") integrates several new technologies that eliminate system
bottlenecks by balancing performance between the processor, I/O, and memory.
The new Xeon processors are available now at speeds ranging from 2.80 to 3.60
GHz, at list prices in quantities of 1,000 ranging from $209 to $851. The E7525
chipset is available at $100 based on 1,000 units. In addition, Intel stated
that a new Xeon processor for 2-way server platforms that takes advantage of
the same technologies as the new workstation processor would be available in
coming months. Vendors including Dell, IBM, HP, Fujitsu, and NEC stated they
would offer solutions based on the new processors.
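One item on that feature list, Intel Extended Memory 64 Technology, can be put in perspective with simple arithmetic: the jump from 32-bit to 64-bit addressing is what lets data-heavy workstation workloads move past the 4 GiB ceiling (practical limits in shipping systems are lower than the architectural one).

```python
# Why 64-bit extensions matter for data-heavy workloads: a 32-bit x86
# processor can directly address at most 2**32 bytes, while 64-bit
# extensions raise the architectural limit to 2**64.

addr_32 = 2**32
addr_64 = 2**64
print(f"32-bit address space: {addr_32 // 2**30} GiB")   # 4 GiB ceiling
print(f"64-bit address space: {addr_64 // 2**30} GiB")
```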
While Intel processor announcements are hardly new, the
new Xeon chip integrates a number of enhancements that will be of particular
interest to technical workstation users. Considering the growing use of
x86-based workstations for applications that were once rarefied turf, oiling up
the wheels of the new Xeon is likely to bring additional pressure to bear on
SGI, Sun, and other workstation specialists. But the fact is that this
announcement is more notable for what it obscures than for what it illuminates. At
mid-February's Intel Developer Forum, Intel quietly discussed plans to deliver
Xeon chips incorporating 64-bit extension technologies like those featured
in AMD’s popular Opteron processors. The decision to
follow AMD’s lead was big news at the time, and is likely why the presence of
these technologies in Nocona is buried between SpeedStep
and Hyper-Threading in this announcement.
Why treat 64-bit extensions like the venerable elephant in the room, which everyone recognizes but no one discusses? For a couple of very good reasons, actually. First, following AMD in any way has to be a bitter pill for Intel, which prides itself on leading the pack in every case; the pill is made doubly bitter by the fact that, by all reports, Intel once considered 64-bit extensions and decided instead to throw its energies into developing Itanium. More importantly, Intel is now stuck between a curious rock and a very hard place, the former being taking advantage of the enthusiastic reception of 64-bit extensions and the latter being preserving face (and market share) for Itanium at any cost. Overall, Intel is likely to have more luck with the first effort. We expect the company to come down with both feet (as usual) in marketing these new Xeon solutions, though we also expect to hear that their success is the result of multiple bells rather than a single 64-bit whistle. Where this will leave Itanium is anyone's guess. IA-64 is unlikely to fail outright or even go away anytime soon, but Intel's careful positioning of Itanium as a high-end HPC play, so notably different from its original general-purpose server pitch, is likely to push future generations of Itanium into a smaller and smaller niche. At the end of the day, Opteron and now Nocona should stand as simple lessons that customers, not vendors, make markets.